11 research outputs found

    MatlabMPI

    Full text link
    The true costs of high performance computing are currently dominated by software. Addressing these costs requires shifting to high-productivity languages such as Matlab. MatlabMPI is a Matlab implementation of the Message Passing Interface (MPI) standard that allows any Matlab program to exploit multiple processors. MatlabMPI currently implements the basic six functions that form the core of the MPI point-to-point communications standard. The key technical innovation of MatlabMPI is that it implements the widely used MPI "look and feel" on top of standard Matlab file I/O, resulting in an extremely compact (~250 lines of code) and "pure" implementation that runs anywhere Matlab runs, and on any heterogeneous combination of computers. The performance has been tested on both shared- and distributed-memory parallel computers (e.g., Sun, SGI, HP, IBM, Linux, and MacOSX). MatlabMPI can match the bandwidth of C-based MPI at large message sizes. A test image-filtering application using MatlabMPI achieved a speedup of ~300 using 304 CPUs and ~15% of the theoretical peak (450 gigaflops) on an IBM SP2 at the Maui High Performance Computing Center. In addition, this entire parallel benchmark application was implemented in 70 software lines of code, illustrating the high productivity of this approach. MatlabMPI is available for download on the web (www.ll.mit.edu/MatlabMPI). Comment: Download software from http://www.ll.mit.edu/MatlabMPI; 12 pages including 7 color figures; submitted to the Journal of Parallel and Distributed Computing.
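    The file-I/O transport described in the abstract can be illustrated with a minimal sketch in Python (not the actual ~250-line Matlab implementation; the function names and the exact data-file/lock-file protocol shown here are assumptions): the sender writes the message payload to a data file in a shared directory, then creates an empty lock file to signal that the message is ready; the receiver polls for the lock file before reading.

```python
import os
import pickle
import tempfile
import time

COMM_DIR = tempfile.mkdtemp()  # stands in for a filesystem visible to all processes

def _paths(source, dest, tag):
    # One buffer file and one lock file per (source, dest, tag) triple.
    base = os.path.join(COMM_DIR, f"msg_{source}_{dest}_{tag}")
    return base + ".pkl", base + ".lock"

def mpi_send(dest, tag, data, source=0):
    data_file, lock_file = _paths(source, dest, tag)
    with open(data_file, "wb") as f:
        pickle.dump(data, f)          # write the payload first...
    open(lock_file, "w").close()      # ...then create the lock file: "message ready"

def mpi_recv(source, tag, dest=0, poll=0.01):
    data_file, lock_file = _paths(source, dest, tag)
    while not os.path.exists(lock_file):  # block until the sender signals completion
        time.sleep(poll)
    with open(data_file, "rb") as f:
        data = pickle.load(f)
    os.remove(data_file)
    os.remove(lock_file)
    return data

mpi_send(dest=1, tag=0, data=[3.14, 2.71])
print(mpi_recv(source=0, tag=0, dest=1))  # → [3.14, 2.71]
```

    Writing the payload before the lock file is what makes the scheme safe on a shared filesystem: the receiver never sees a half-written message, which is how a pure file-I/O layer can mimic blocking point-to-point semantics with no daemons or sockets.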

    Campus Bridging: Campus Leadership Engagement in Building a Coherent Campus Cyberinfrastructure Workshop Report

    Get PDF
    This report presents the discussions and recommendations from “Campus Leadership Engagement in Building a Coherent Campus Cyberinfrastructure,” a workshop held in Anaheim, California, October 10-12, 2010. The main goal of the workshop was to gather the thoughts, ideas, and perspectives of senior university administrators. The resulting report covers the following topics: - The current state of campus bridging from the perspectives of the CIO and VP for Research. - Challenges and opportunities at the campus leadership level for enabling campus bridging in the university community. - The senior campus leadership advocacy role in promoting campus bridging. This workshop and the preparation of this report and related documents were supported by several sources, including: - The National Science Foundation, through grant #OCI-1059812 (Patrick Dreher, PI; Craig A. Stewart, James Pepin, Guy Almes, and Michael Mundrane, Co-PIs). - RENCI (the Renaissance Computing Institute, http://www.renci.org/), which supported this workshop and report by generously providing the time and effort of Patrick Dreher and through the underwriting of this effort by RENCI Director Stanley Ahalt. - The Indiana University Pervasive Technology Institute (http://pti.iu.edu/), which funded staff providing logistical support of the task force activities, writing and editorial staff, and layout and production of the final report document. - Texas A&M University (http://www.tamu.edu), which supported this workshop and report by generously providing the time and effort of Guy Almes. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation, the Indiana University Pervasive Technology Institute, or Indiana University.

    Developing a Coherent Cyberinfrastructure from Local Campus to National Facilities: Challenges and Strategies

    Get PDF
    A fundamental goal of cyberinfrastructure (CI) is the integration of computing hardware, software, and network technology, along with data, information management, and human resources, to advance scholarship and research. Such integration creates opportunities for researchers, educators, and learners to share ideas, expertise, tools, and facilities in new and powerful ways that cannot be realized if each of these components is applied independently. Bridging the gap between the reality of CI today and its potential in the immediate future is critical to building a balanced CI ecosystem that can support future scholarship and research. This report summarizes the observations and recommendations from a workshop in July 2008 sponsored by the EDUCAUSE Net@EDU Campus Cyberinfrastructure Working Group (CCI) and the Coalition for Academic Scientific Computation (CASC). The invitational workshop was hosted at the University Place Conference Center on the IUPUI campus in Indianapolis. Over 50 individuals representing a cross-section of faculty, senior campus information technology leaders, national lab directors, and other CI experts attended. The workshop focused on the challenges that must be addressed to build a coherent CI from the local to the national level, and the potential opportunities that would result. Both the organizing committee and the workshop participants hope that some of the ideas, suggestions, and recommendations in this report will take hold and be implemented in the community. The goal is to create a better, more supportive, more usable CI environment in the future to advance both scholarship and research.

    Progress Report: Multi-Aperture SAR Target Detection Using Hidden Markov Models

    No full text
    This report highlights our current work and accomplishments on the project to exploit angular diversity for improved target detection in multi-aperture SAR images. This report also contains a brief introduction to hidden Markov models and identifies issues that we will resolve as work continues. We have analyzed multi-aperture SAR images and demonstrated that anisotropic behavior is present in our multi-aperture SAR image set. We performed baselining studies using the common method of CFAR LTT detection and formulated an HMM detection method using Baum-Welch reestimation to train HMMs to represent target, tree clutter, and ground clutter pixels. Our results show that HMM detection produced significantly better results than CFAR LTT detection (with a 29-by-29 reference window) for the same multi-aperture SAR image and requires less computation. This research was supported by Wright Laboratory. The SPANN Lab's WWW URL is http://eewww.eng.ohio-state.edu/research/spann/.
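    The detection step described above can be illustrated with a toy Python sketch (not the report's code): each pixel class gets its own HMM over the sequence of quantized returns across apertures, and a pixel sequence is labeled with whichever model scores it highest under the forward algorithm. The Baum-Welch training the report uses is omitted here; all model names and parameters below are hand-set, hypothetical values chosen only for illustration.

```python
import math

# Hypothetical per-class HMMs. States: 0 = "dim", 1 = "bright";
# symbols: 0 = low amplitude, 1 = high amplitude across apertures.
MODELS = {
    "target": {   # persistent bright returns from aperture to aperture
        "pi": [0.5, 0.5],
        "A":  [[0.9, 0.1], [0.1, 0.9]],
        "B":  [[0.9, 0.1], [0.1, 0.9]],
    },
    "clutter": {  # memoryless, uninformative returns
        "pi": [0.5, 0.5],
        "A":  [[0.5, 0.5], [0.5, 0.5]],
        "B":  [[0.5, 0.5], [0.5, 0.5]],
    },
}

def forward_loglik(obs, pi, A, B):
    """Log-likelihood of a discrete observation sequence under an HMM,
    computed with the scaled forward (alpha) recursion."""
    n = len(pi)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    s = sum(alpha)
    loglik = math.log(s)
    alpha = [a / s for a in alpha]
    for t in range(1, len(obs)):
        alpha = [sum(alpha[j] * A[j][i] for j in range(n)) * B[i][obs[t]]
                 for i in range(n)]
        s = sum(alpha)            # rescale each step to avoid underflow,
        loglik += math.log(s)     # accumulating the log of the scale factor
        alpha = [a / s for a in alpha]
    return loglik

def classify(obs):
    """Label a quantized pixel sequence with the HMM that scores it highest."""
    return max(MODELS, key=lambda name: forward_loglik(obs, **MODELS[name]))

print(classify([1, 1, 1, 1, 1, 1]))  # persistently bright → "target"
print(classify([0, 1, 0, 1, 0, 1]))  # uncorrelated flicker → "clutter"
```

    Because the per-class likelihoods are just a handful of small matrix-vector products per aperture, this comparison is cheap, which is consistent with the report's observation that HMM detection requires less computation than sliding a 29-by-29 CFAR reference window.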

    Developing Scientific Software through the Open Community Engagement Process

    No full text
    <p>Today's research relies on trustworthy software. Many scientists develop their own software; however, the quality of academia-developed software tends to be lower than that of commercially developed software because, in academia, there are barriers to using proven software engineering methods. To help overcome these barriers, the Water Science Software Institute (WSSI) has developed a model, the Open Community Engagement Process (OCEP), which brings software engineers and scientists together to traverse a four-step, iterative process that incorporates Agile development principles and open source mechanics. As part of OCEP, WSSI has engaged a water science community that uses a scientist-developed computational modeling framework originally developed in the early 1990s. Efforts to improve this software have included two hackathons. This paper compares these hackathons to determine the factors that influenced hackathon outcomes. Thorough planning with sufficient lead time before a hackathon, clarification of expectations, sufficient time for discussion of objectives and clarification of domain vocabulary, and co-location of participants were identified as key factors contributing to hackathon success.</p>

    Welcome Remarks and South Hub Impacts and Opportunities

    No full text
    Presented on April 9, 2019 at 8:30 a.m. in the Technology Square Research Building (TSRB) Banquet Hall, Georgia Institute of Technology. South Big Data Innovation Hub; All Hands Meeting.
    Chaouki T. Abdallah, PhD, is the Executive Vice President for Research (EVPR) at Georgia Tech and a professor in the School of Electrical and Computer Engineering. As EVPR, Dr. Abdallah directs Georgia Tech's research program and serves as chief research officer, providing overall leadership for the research, economic development, and related support units within the Institute. He is a proud alumnus of Georgia Tech, earning his M.S. and Ph.D. in Electrical Engineering in 1982 and 1988, respectively.
    Stan Ahalt, PhD, Principal Investigator, South Big Data Regional Innovation Hub, University of North Carolina-Chapel Hill, is the Director of the Renaissance Computing Institute (RENCI) at UNC-Chapel Hill. As Director, he leads a team of research scientists, software and network engineers, data science specialists, and visualization experts who work closely with faculty research teams at UNC, Duke, and NC State as well as with partners across the country. RENCI's role is to provide enabling cyberinfrastructure to these research collaborations, which entails working on the challenges of data management, sharing, integration, and security. Dr. Ahalt is also a Professor in the UNC Computer Science Department and the Associate Director of the Informatics and Data Science (IDSci) Service in the North Carolina Translational and Clinical Sciences Institute (NC TraCS), UNC's CTSA award. Dr. Ahalt earned his Ph.D. in Electrical and Computer Engineering from Clemson University and has over 30 years of experience in high performance computing, signal processing, and pattern recognition.
    Srinivas Aluru, PhD, is a Principal Investigator of the South Big Data Regional Innovation Hub (South Hub), a co-Executive Director of the Georgia Tech Interdisciplinary Research Institute (IRI) in Data Engineering and Science (IDEaS), and a professor in the School of Computational Science and Engineering within the College of Computing at Georgia Tech. Dr. Aluru conducts research in high performance computing, data science, bioinformatics and systems biology, combinatorial scientific computing, and applied algorithms. He pioneered the development of parallel methods in computational biology and contributed to the assembly and analysis of complex plant genomes.
    Renata Rawlings-Goss, PhD, Executive Director, South Big Data Regional Innovation Hub, Georgia Institute of Technology, currently leads the South Big Data Innovation Hub. Formerly, she worked with the White House Office of Science and Technology Policy to create the National Data Science Organizers Group, which facilitates data science groups to address national “Grand Challenge” problems. Dr. Rawlings-Goss was awarded a AAAS fellowship and worked with the National Science Foundation in the directorate of Computer and Information Science and Engineering (CISEOAD) on the Big Data research program, as well as Big Data policies and priority goals for the foundation. She sat on the NITRD interagency Big Data Senior Steering Group, charged with strategic planning for Big Data research funded by the federal government. Dr. Rawlings-Goss is a biophysicist by training whose research interests include data-driven analysis of genetic/expression variation among worldwide human populations.
    Runtime: 56:42 minutes.
    The SBDH All Hands meetings are organized to bring together the SBDH community in order to foster new and support existing data science collaborations and to share best practices and resources for data and data science related projects in the priority areas for the southern region. The meeting is an opportunity to find collaborators, share accomplishments, extend your network, offer resources, and take on a leadership role in regional initiatives.

    A Framework for the Interoperability of Cloud Platforms: Towards FAIR Data in SAFE Environments

    Full text link
    As the number of cloud platforms supporting scientific research grows, and as a growing amount of data is hosted in cloud-based platforms, there is an increasing need to support interoperability between two or more cloud platforms. A well-accepted core concept is to make data in cloud platforms Findable, Accessible, Interoperable, and Reusable (FAIR). We introduce a companion concept that applies to cloud-based computing environments, which we call a Secure and Authorized FAIR Environment (SAFE). SAFE environments require data and platform governance structures and are designed to support the interoperability of sensitive or controlled-access data, such as biomedical data. A SAFE environment is a cloud platform that has been approved, through a defined data and platform governance process, as authorized to hold data from another cloud platform, and that exposes appropriate APIs for the two platforms to interoperate. Comment: 16 pages with 2 figures.

    Toward a Framework for Evaluating Software Success: A Proposed First Step

    No full text
    <p>Software is a particularly critical technology in many computational science and engineering (CSE) sectors. Consequently, software is increasingly becoming an important component in the evaluation of competitive grants and the execution of research projects. As a result, software can be viewed as a scholarly contribution and has been proposed as a new factor to consider in tenure and promotion processes. However, existing metrics for evaluating the capability, use, reusability, or success of software are sorely lacking. This lack of software metrics permits the development of software based on poor development practices, which in turn allows poorly written software to “fly under the radar” in the scientific community and persist undetected. The absence of evaluation by knowledgeable peers often leads to the establishment and adoption of tools based on aggressive promotion by developers, ease-of-use, and other peripheral factors, hindering the sustainability, usefulness, and uptake of software and even leading to unreliable scientific findings. All of these factors mean that addressing the current lack of software evaluation metrics and methods is not just a question of increasing scientific productivity, but also a matter of preventing poor science.</p> <p>As a first step toward creating a methodology and framework for developing and evolving software success metrics for the CSE community, we propose the creation of a software “peer-review group.” This group, composed of grant recipients funded to develop sustainable software, would meet periodically to evaluate their own and each other's software, developing and refining success metrics along the way. We envision the group as a pilot test for a potential larger-scale effort to establish a more formal framework for software success metrics and evaluation.</p>